Inexact Newton Regularization Using Conjugate Gradients as Inner Iteration

Author

  • Andreas Rieder
Abstract

In our papers [Inverse Problems, 15, 309–327, 1999] and [Numer. Math., 88, 347–365, 2001] we proposed the algorithm REGINN, an inexact Newton iteration for the stable solution of nonlinear ill-posed problems. REGINN consists of two components: an outer iteration, which is a Newton iteration stopped by the discrepancy principle, and an inner iteration, which computes the Newton correction by solving the linearized system. The convergence analysis presented in both papers covers virtually any linear regularization method as inner iteration, especially the Landweber iteration, the ν-methods, and Tikhonov–Phillips regularization. In the present paper we prove convergence rates for REGINN when the conjugate gradient method, which is nonlinear, serves as inner iteration. We thereby add to the convergence analysis of Hanke, who investigated REGINN furnished with the conjugate gradient method earlier [Numer. Funct. Anal. Optim., 18, 971–993, 1997]. Numerical experiments illustrate that the conjugate gradient method outperforms the ν-method as inner iteration.
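The two-level structure described in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the inner solver below is CG applied to the normal equations (CGNE), and the parameter names `tau` (discrepancy-principle factor) and `mu` (inner residual tolerance) as well as the toy test problem are illustrative assumptions.

```python
import numpy as np

def cg_normal(A, b, mu, max_iter=50):
    """Inner iteration (illustrative): CG on the normal equations
    A^T A s = A^T b, stopped once the relative linear residual
    ||b - A s|| / ||b|| drops below the tolerance mu."""
    s = np.zeros(A.shape[1])
    r = b.copy()                 # linear residual b - A s
    g = A.T @ r                  # negative gradient of 0.5 ||b - A s||^2
    d = g.copy()
    b_norm = np.linalg.norm(b)
    for _ in range(max_iter):
        if np.linalg.norm(r) <= mu * b_norm:
            break
        Ad = A @ d
        alpha = (g @ g) / (Ad @ Ad)
        s += alpha * d
        r -= alpha * Ad
        g_new = A.T @ r
        beta = (g_new @ g_new) / (g @ g)
        d = g_new + beta * d
        g = g_new
    return s

def reginn(F, dF, x0, y_delta, delta, tau=2.0, mu=0.7, max_newton=50):
    """Outer iteration: Newton updates x <- x + s, where the correction s
    comes from the inner CG run on the linearized system
    dF(x) s = y_delta - F(x). The outer loop is terminated by the
    discrepancy principle ||y_delta - F(x)|| <= tau * delta."""
    x = x0.copy()
    for _ in range(max_newton):
        r = y_delta - F(x)
        if np.linalg.norm(r) <= tau * delta:
            break
        x = x + cg_normal(dF(x), r, mu)
    return x
```

For a mildly nonlinear, well-conditioned toy operator (e.g. `F(x) = A @ x + 0.05 * x**3` with Jacobian `A + 0.15 * np.diag(x**2)`), the outer loop typically meets the discrepancy criterion within a handful of Newton steps.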

Similar references

Towards a general convergence theory for inexact Newton regularizations

We develop a general convergence analysis for a class of inexact Newton-type regularizations for stably solving nonlinear ill-posed problems. Each of the methods under consideration consists of two components: the outer Newton iteration and an inner regularization scheme which, applied to the linearized system, provides the update. In this paper we give a novel and unified convergence analysis w...

On the order optimality of the regularization via inexact Newton iterations

Inexact Newton regularization methods have been proposed by Hanke and Rieder for solving nonlinear ill-posed inverse problems. Every such method consists of two components: an outer Newton iteration and an inner scheme providing increments by regularizing local linearized equations. The method is terminated by a discrepancy principle. In this paper we consider the inexact Newton regularizatio...

Signal Reconstruction in Sensor Arrays Using Temporal-Spatial Sparsity Regularization

We propose a technique of multisensor signal reconstruction based on the assumption that the source signals are spatially sparse and also have a sparse (wavelet-type) representation in the time domain. This leads to a large-scale convex optimization problem, which involves l1-norm minimization. The optimization is carried out by the truncated Newton method, using preconditioned conjugate gradients in inn...

Quasi-Newton Bundle-Type Methods for Nondifferentiable Convex Optimization

In this paper we provide implementable methods for solving nondifferentiable convex optimization problems. A typical method minimizes an approximate Moreau–Yosida regularization using a quasi-Newton technique with inexact function and gradient values which are generated by a finite inner bundle algorithm. For a BFGS bundle-type method, global and superlinear convergence results for the outer ite...

On nonlinear generalized conjugate gradient methods

where F(ξ) is a nonlinear operator from a real Euclidean space of dimension n, or a Hilbert space, into itself. The Euclidean norm and corresponding inner product will be denoted by ‖·‖1 and (·, ·)1, respectively. A different, more general inner product with a weight function and the corresponding norm will be denoted by (·, ·)0 and ‖ · ‖, respectively. In the first part of this article (Sects. 2 and 3) w...

Journal title:
  • SIAM J. Numerical Analysis

Volume 43, Issue —

Pages —

Year of publication: 2005